
    Evaluation of deep learning with long short-term memory networks for time series forecasting in supply chain management

    Performance analysis and forecasting the evolution of complex systems are two challenging tasks in manufacturing. Time series data from complex systems capture the dynamic behaviors of the underlying processes. However, non-linear and non-stationary dynamics pose a major challenge for accurate forecasting. To overcome these statistical complexities, we approach time series analysis with deep learning methods. In this paper, we focus on long short-term memory (LSTM) networks for demand forecasting in supply chain management, where the future demand for a product is the basis for the respective replenishment systems. This study contributes to the literature by conducting experiments on real data to investigate the potential of LSTM networks for final customer demand forecasting, and hence for increasing the overall value generated by a supply chain. Both the forward LSTM and the bidirectional (forward-backward) LSTM are considered for short- and long-term demand prediction in supply chain management.
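A minimal sketch of the kind of forward and bidirectional LSTM forecasters the abstract describes, assuming a univariate demand series cut into fixed-length sliding windows; the window length, layer sizes, and training settings are illustrative assumptions, not the paper's configuration.

```python
# Sketch: forward and bidirectional LSTM one-step-ahead demand forecasters.
# Window length, units and epochs are assumptions, not the paper's settings.
import numpy as np
import tensorflow as tf

def make_windows(series, window=12):
    """Turn a 1-D demand series into (samples, window, 1) inputs and next-step targets."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    X = np.asarray(X, dtype="float32")[..., np.newaxis]
    return X, np.asarray(y, dtype="float32")

demand = np.random.rand(200).astype("float32")   # placeholder for the real demand series
X, y = make_windows(demand)

forward_lstm = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(X.shape[1], 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])

bidirectional_lstm = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(X.shape[1], 1)),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(32)),
    tf.keras.layers.Dense(1),
])

for model in (forward_lstm, bidirectional_lstm):
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=10, batch_size=16, verbose=0)

next_demand = forward_lstm.predict(X[-1:])  # one-step-ahead forecast from the latest window
```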

    Logistic regression and response surface design for statistical modeling of investment casting process in metal foam production

    A metal foam is a promising material because it retains the high mechanical properties of the metal while reducing its weight by up to 90%. Among the available manufacturing processes, investment casting is a foundry process flexible enough to be suitable for both stochastic and regular foams. This paper presents an experimental characterization of the manufacturing process of regular metal foams by investment casting. The goal is to derive an actual formability map experimentally. The use of logistic regression and response surface design is proposed as an effective tool for determining a statistical model of the metal foam casting process.
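A hedged sketch of how such a formability map could be estimated: a logistic regression fitted on a full quadratic (response-surface) expansion of the process factors using scikit-learn. The factor names, data, and threshold are placeholders, not the paper's experimental design.

```python
# Sketch: logistic regression on a quadratic (response-surface) expansion of
# two hypothetical process factors; data and factor meanings are placeholders.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.uniform(size=(60, 2))      # e.g. normalized cell size and strut thickness (assumed factors)
y = (X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=60) > 0.8).astype(int)  # 1 = sound casting (toy labels)

model = make_pipeline(
    PolynomialFeatures(degree=2, include_bias=False),  # linear, interaction and quadratic terms
    LogisticRegression(),
)
model.fit(X, y)

# Predicted probability of a sound casting over a factor grid gives an estimated formability map.
grid = np.array([[a, b] for a in np.linspace(0, 1, 21) for b in np.linspace(0, 1, 21)])
p_sound = model.predict_proba(grid)[:, 1]
```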

    Binary Gaussian Process classification of quality in the production of aluminum alloy foams with regular open cells

    Aluminum alloy foams with homogeneous and regular open cells have frequently been proposed and used as support structures for catalytic applications. In this kind of application, the quality of the produced metal foam assumes primary importance. This paper presents an application of a classification algorithm to predict quality in the manufacturing process of aluminum alloy foams with homogeneous and regular open cells. A data analysis methodology for experimental data, based on Binary Gaussian Process Classification, is presented. The proposed method is a Bayesian classification method that avoids assumptions about the relationship between the process inputs (the geometric design parameters of the regular unit cells) and the process output (the probability of obtaining a defective foam). We demonstrate that the proposed methodology can provide an effective tool for deriving a model for the prediction of quality. An investment casting process, via 3D printing of wax patterns, is considered throughout the paper. Beyond this specific case study, the methodology can be exploited in other processes in which the assumptions of traditional statistical approaches cannot be easily verified, e.g., additive manufacturing.
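A minimal sketch of binary Gaussian Process classification applied to this kind of quality data, using scikit-learn's GaussianProcessClassifier with an RBF kernel; the input parameters, toy labels, and kernel choice are assumptions, not the paper's model.

```python
# Sketch: binary Gaussian Process classification of foam quality from geometric
# unit-cell parameters; data, parameter meanings and kernel are placeholders.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(1)
X = rng.uniform(size=(40, 2))                    # e.g. cell diameter and strut thickness (assumed inputs)
y = (X[:, 0] - 0.4 * X[:, 1] > 0.3).astype(int)  # 1 = defective foam (toy labels)

kernel = ConstantKernel(1.0) * RBF(length_scale=1.0)   # hyperparameters fitted by marginal likelihood
gpc = GaussianProcessClassifier(kernel=kernel, random_state=0).fit(X, y)

# Posterior probability of obtaining a defective foam for new unit-cell designs.
p_defect = gpc.predict_proba(np.array([[0.5, 0.2], [0.2, 0.8]]))[:, 1]
```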

    High-dimensional statistics for complex data

    High-dimensional data analysis has become a popular research topic in recent years, due to the emergence of new applications in several fields of science that underscore the need for analysing massive data sets. One of the main challenges in analysing high-dimensional data concerns the interpretability of the estimated models as well as the computational efficiency of the procedures adopted. Such a purpose can be achieved through the identification of the relevant variables that really affect the phenomenon of interest, so that effective models can subsequently be constructed and applied to solve practical problems. The first two chapters of the thesis are devoted to high-dimensional statistics for variable selection. We first provide a concise but thorough review of the main techniques developed for the general problem of variable selection using nonparametric statistics. Finally, in Chapter 3 we present our proposal for a feature screening approach for non-additive models, developed by using conditional information in the estimation procedure...
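For context, a generic marginal screening sketch (sure-independence-screening style) is shown below; this is not the thesis's conditional-information procedure, and indeed a purely marginal ranking like this one can miss non-additive effects, which is the motivation for conditional approaches. The data and the number of retained variables are assumptions.

```python
# Generic marginal feature screening sketch (a stand-in, not the thesis's
# conditional screening method): rank p >> n predictors by estimated mutual
# information with the response and keep the top d.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(2)
n, p, d = 100, 2000, 20                              # high-dimensional setting: p >> n
X = rng.normal(size=(n, p))
y = X[:, 0] * X[:, 1] + 0.5 * rng.normal(size=n)     # non-additive signal in the first two variables

scores = mutual_info_regression(X, y)                # one marginal relevance score per predictor
keep = np.argsort(scores)[::-1][:d]                  # indices of the d highest-ranked variables
X_screened = X[:, keep]                              # reduced design matrix for subsequent modeling
```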

    Multisensor data fusion via Gaussian process models for dimensional and geometric verification

    An increasing number of commercial measurement instruments, implementing a wide range of measurement technologies, are rapidly becoming available for dimensional and geometric verification. Multiple solutions are often acquired within the shop floor with the aim of providing alternatives to cover a wider array of measurement needs, thus overcoming the limitations of individual instruments and technologies. In such scenarios, multisensor data fusion aims at going one step further by seeking original ways to analyze and combine multiple measurement datasets taken from the same measurand, in order to produce synergistic effects and ultimately obtain better overall measurement results. In this work an original approach to multisensor data fusion is presented, based on the development of Gaussian process models (a technique also known as kriging), starting from point sets acquired with multiple instruments. The approach is illustrated and validated through its application to a simulated test case and two real-life industrial metrology scenarios involving structured light scanners and coordinate measuring machines. The results show that the proposed approach not only yields final measurement results whose metrological quality exceeds that of the original single-sensor datasets, but also allows a better characterization of the metrological performance and of the potential sources of measurement error within each individual sensor.
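One common way to fuse such point sets with Gaussian process models, sketched below under stated assumptions: a GP (kriging) model of the discrepancy between a dense but less accurate scanner point set and sparse, accurate CMM points is used to correct the dense set. This is a generic scheme for illustration, not necessarily the paper's exact formulation; the surfaces, noise levels, and kernels are placeholders.

```python
# Sketch of GP-based fusion of a dense structured-light scan with sparse CMM
# points: a GP models the scanner-vs-CMM systematic discrepancy. Surfaces,
# noise levels and kernels are toy assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)

def true_surface(xy):                              # unknown measurand (toy stand-in)
    return np.sin(xy[:, 0]) * np.cos(xy[:, 1])

xy_scan = rng.uniform(0, 3, size=(500, 2))         # dense scan: biased and noisy heights
z_scan = true_surface(xy_scan) + 0.05 + 0.02 * rng.normal(size=500)

xy_cmm = rng.uniform(0, 3, size=(25, 2))           # sparse but accurate CMM points
z_cmm = true_surface(xy_cmm) + 0.002 * rng.normal(size=25)

# Interpolate the scan at the CMM locations with a first GP, then model the discrepancy.
gp_scan = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(1e-3)).fit(xy_scan, z_scan)
discrepancy = z_cmm - gp_scan.predict(xy_cmm)

gp_disc = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(1e-4)).fit(xy_cmm, discrepancy)

# Fused estimate: the dense scan corrected by the predicted systematic discrepancy.
z_fused = z_scan + gp_disc.predict(xy_scan)
```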

    Cyber-physical systems (CPS) in supply chain management: from foundations to practical implementation

    Since 2015, developments such as Industry 4.0 and cyber-physical production systems on the technology side, together with approaches such as flexible and smart manufacturing systems, have shown great potential. These in turn give rise to special requirements for production planning, control and monitoring, among other functions, which need a paradigm shift to exploit the full potential of these methods and techniques. Starting from the foundations of Cyber-Physical Systems (CPS), and building upon definitions and findings reported in the literature, a practical example of an innovative Cyber-Physical Supply Chain Planning System (CPS2) is provided. The paper clarifies the advantages of cyber-physical systems from the production planning, control and monitoring perspective with respect to manufacturing, logistics and related planning practices. A set of basic features of CPS2 systems is discussed and addressed by contextualizing service-oriented architecture and microservice components with respect to supply chain management collaboration and cooperation practices. The identification of the specific technologies behind those functions, within the developed research, provides some practical insight into the potential of CPS2.
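A purely hypothetical sketch of the kind of microservice component such an architecture could expose: a small HTTP service offering a replenishment-planning endpoint that other supply chain services could call. The endpoint name, payload fields, and order-up-to rule are assumptions for illustration, not part of the CPS2 implementation.

```python
# Hypothetical microservice sketch (not the CPS2 implementation): a small HTTP
# service exposing a replenishment-planning endpoint for other supply chain
# services. Endpoint name, payload and order-up-to rule are assumptions.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/replenishment-plan", methods=["POST"])
def replenishment_plan():
    data = request.get_json()
    forecast = data["forecast_demand"]        # e.g. produced by a separate forecasting service
    on_hand = data["on_hand_inventory"]
    safety_stock = data.get("safety_stock", 0)
    order_qty = max(0, forecast + safety_stock - on_hand)   # simple order-up-to rule
    return jsonify({"order_quantity": order_qty})

if __name__ == "__main__":
    app.run(port=5000)   # each planning function runs as an independently deployable service
```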

    A comparison study of distribution-free multivariate SPC methods for multimode data

    The data-rich environments of industrial applications lead to large amounts of correlated quality characteristics that are monitored using Multivariate Statistical Process Control (MSPC) tools. These variables usually represent heterogeneous quantities that originate from one or multiple sensors and are acquired with different sampling parameters. In this framework, any assumption about the underlying statistical distribution may not be appropriate, and conventional MSPC methods may deliver unacceptable performance. In addition, in many practical applications the process switches from one operating mode to another, leading to a stream of multimode data. Various nonparametric approaches have been proposed for the design of multivariate control charts, but the monitoring of multimode processes remains a challenge for most of them. In this study, we investigate the use of distribution-free MSPC methods based on statistical learning tools. In particular, we compare the kernel distance-based control chart (K-chart), based on a one-class-classification variant of support vector machines, with a fuzzy neural network method based on adaptive resonance theory. The performance of the two methods is evaluated using both Monte Carlo simulations and real industrial data. The simulated scenarios include different types of out-of-control conditions to highlight the advantages and disadvantages of the two methods. Real data acquired during a roll grinding process provide a framework for assessing the practical applicability of these methods in multimode industrial applications.
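A hedged sketch of monitoring in the spirit of the K-chart: a one-class SVM is fitted to in-control (Phase I) multimode data and its decision function serves as the monitoring statistic against an empirical control limit. The nu/gamma values, the two simulated operating modes, and the percentile-based limit are illustrative assumptions, not the study's design.

```python
# Sketch of a distribution-free monitoring scheme in the spirit of the K-chart:
# a one-class SVM is fitted to in-control multivariate data and its decision
# function is the monitoring statistic. Settings are illustrative assumptions.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(4)

# Phase I: in-control data from two operating modes (multimode process).
mode_a = rng.normal(loc=0.0, scale=1.0, size=(200, 3))
mode_b = rng.normal(loc=4.0, scale=1.0, size=(200, 3))
phase1 = np.vstack([mode_a, mode_b])

chart = OneClassSVM(kernel="rbf", nu=0.01, gamma="scale").fit(phase1)

# Control limit from the empirical distribution of the Phase I decision values.
d_phase1 = chart.decision_function(phase1)
control_limit = np.quantile(d_phase1, 0.005)          # roughly a 0.5% false-alarm rate

# Phase II: signal an out-of-control condition when the statistic drops below the limit.
new_obs = np.array([[0.2, -0.1, 0.3], [7.0, 7.0, 7.0]])
signals = chart.decision_function(new_obs) < control_limit   # second point should signal
```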